Infinitely many fast homoclinic solutions for some second-order nonautonomous systems
We investigate the existence of infinitely many fast homoclinic solutions for a class of second-order nonautonomous systems. Our main tool is the variant fountain theorem. A criterion guaranteeing that the second-order system has infinitely many fast homoclinic solutions is obtained. Recent results from the literature are generalized and significantly improved.
Positive Periodic Solutions for Impulsive Functional Differential Equations with Infinite Delay and Two Parameters
We apply Krasnoselskii's fixed point theorem to study the existence of multiple positive periodic solutions for a class of impulsive functional differential equations with infinite delay and two parameters. In particular, the presented criteria improve and generalize some related results in the literature. As an application, we study some special cases of such systems, which have been studied extensively in the literature.
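For reference, the cone compression/expansion form of Krasnoselskii's fixed point theorem that typically underlies such multiplicity results can be stated as follows (a standard textbook statement, not quoted from the paper):

Let $E$ be a Banach space and $P \subset E$ a cone. Let $\Omega_1, \Omega_2$ be bounded open subsets of $E$ with $0 \in \Omega_1$ and $\overline{\Omega}_1 \subset \Omega_2$, and let
$$A \colon P \cap (\overline{\Omega}_2 \setminus \Omega_1) \to P$$
be a completely continuous operator such that either
(i) $\|Ax\| \le \|x\|$ for $x \in P \cap \partial\Omega_1$ and $\|Ax\| \ge \|x\|$ for $x \in P \cap \partial\Omega_2$, or
(ii) $\|Ax\| \ge \|x\|$ for $x \in P \cap \partial\Omega_1$ and $\|Ax\| \le \|x\|$ for $x \in P \cap \partial\Omega_2$.
Then $A$ has at least one fixed point in $P \cap (\overline{\Omega}_2 \setminus \Omega_1)$.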
Global Positive Periodic Solutions for Periodic Two-Species Competitive Systems with Multiple Delays and Impulses
A set of easily verifiable sufficient conditions is derived to guarantee the existence and global stability of positive periodic solutions for two-species competitive systems with multiple delays and impulses, by applying some new analysis techniques. This improves and extends a series of well-known sufficiency theorems in the literature on these problems.
Existence and Stability of Positive Periodic Solutions for a Neutral Multispecies Logarithmic Population Model with Feedback Control and Impulse
We investigate a neutral multispecies logarithmic population model with feedback control and impulse. By applying the contraction mapping principle and some inequality techniques, a set of easily applicable criteria for the existence, uniqueness, and global attractivity of a positive periodic solution is established. The conditions we obtain are weaker than the previously known ones and can easily be reduced to several special cases. We also give an example to illustrate the applicability of our results.
Global Positive Periodic Solutions of Generalized n-Species Competition Systems with Multiple Delays and Impulses
By applying a fixed point theorem in a cone of a Banach space, we obtain an easily verifiable necessary and sufficient condition for the existence of positive periodic solutions of the following two kinds of generalized n-species competition systems with multiple delays and impulses:
$$
\begin{aligned}
x_i'(t) &= x_i(t)\Big[a_i(t)-b_i(t)x_i(t)-\sum_{j=1}^{n}c_{ij}(t)\,x_i(t-\tau_{ij}(t))-\sum_{j=1}^{n}d_{ij}(t)\,x_j(t-\gamma_{ij}(t)) \\
&\qquad-\sum_{j=1}^{n}e_{ij}(t)\int_{-\sigma_{ij}}^{0}f_{ij}(s)\,x_j(t+s)\,ds\Big],\quad \text{a.e. } t>0,\ t\neq t_k,\ k\in\mathbb{Z}^{+},\ i=1,2,\dots,n, \\
x_i(t_k^{+})-x_i(t_k^{-}) &= \theta_{ik}\,x_i(t_k),\qquad i=1,2,\dots,n,\ k\in\mathbb{Z}^{+},
\end{aligned}
$$

and

$$
\begin{aligned}
x_i'(t) &= x_i(t)\Big[a_i(t)-b_i(t)x_i(t)+\sum_{j=1}^{n}c_{ij}(t)\,x_i(t-\tau_{ij}(t))-\sum_{j=1}^{n}d_{ij}(t)\,x_j(t-\gamma_{ij}(t)) \\
&\qquad-\sum_{j=1}^{n}e_{ij}(t)\int_{-\sigma_{ij}}^{0}f_{ij}(s)\,x_j(t+s)\,ds\Big],\quad \text{a.e. } t>0,\ t\neq t_k,\ k\in\mathbb{Z}^{+},\ i=1,2,\dots,n, \\
x_i(t_k^{+})-x_i(t_k^{-}) &= \theta_{ik}\,x_i(t_k),\qquad i=1,2,\dots,n,\ k\in\mathbb{Z}^{+}.
\end{aligned}
$$

These results improve and generalize a series of well-known sufficiency theorems in the literature on these problems.
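As a rough illustration only (not taken from the paper), the following minimal Python sketch simulates a two-species instance of the first system above with constant, illustrative coefficients, discrete delays only (the distributed-delay integral is omitted), a constant initial history, and a simple explicit Euler scheme with multiplicative impulses at the times t_k:

import numpy as np

n, dt, T = 2, 0.001, 50.0
steps = int(T / dt)

# All parameter values below are illustrative assumptions, not from the paper.
a = np.array([1.0, 0.8])                    # intrinsic growth rates a_i
b = np.array([0.5, 0.6])                    # self-limitation coefficients b_i
c = np.array([[0.10, 0.05], [0.05, 0.10]])  # c_ij, multiplying delayed x_i
d = np.array([[0.05, 0.10], [0.10, 0.05]])  # d_ij, multiplying delayed x_j
tau = np.full((n, n), 0.5)                  # delays tau_ij
gam = np.full((n, n), 1.0)                  # delays gamma_ij
impulse_times = np.arange(5.0, T, 5.0)      # impulse instants t_k
theta = np.array([0.2, -0.1])               # impulse sizes theta_ik (same for every k here)

x = np.empty((steps + 1, n))
x[0] = [0.5, 0.5]                           # constant history assumed for t <= 0

def delayed(hist, step, lag):
    # trajectory value at time step*dt - lag, clamped to the stored history
    return hist[max(0, step - int(round(lag / dt)))]

next_imp = 0
for k in range(steps):
    t = k * dt
    # impulse: x_i(t_k^+) = (1 + theta_i) * x_i(t_k^-)
    if next_imp < len(impulse_times) and t >= impulse_times[next_imp]:
        x[k] = (1.0 + theta) * x[k]
        next_imp += 1
    rhs = np.empty(n)
    for i in range(n):
        s = a[i] - b[i] * x[k, i]
        for j in range(n):
            s -= c[i, j] * delayed(x, k, tau[i, j])[i]  # c_ij * x_i(t - tau_ij)
            s -= d[i, j] * delayed(x, k, gam[i, j])[j]  # d_ij * x_j(t - gamma_ij)
        rhs[i] = x[k, i] * s
    x[k + 1] = x[k] + dt * rhs

print("state at T:", x[-1])

The delayed lookups simply fall back to the constant initial state whenever t minus the delay is negative, which mimics a constant history on t <= 0.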
Global Positive Periodic Solutions of Generalized n-Species Lotka-Volterra Type and Gilpin-Ayala Type Competition Systems with Multiple Delays and Impulses
We consider the following generalized n-species Lotka-Volterra type and Gilpin-Ayala type competition systems with multiple delays and impulses:

$$
\begin{aligned}
x_i'(t) &= x_i(t)\Big[a_i(t)-b_i(t)x_i(t)-\sum_{j=1}^{n}c_{ij}(t)\,x_j^{\alpha_{ij}}(t-\rho_{ij}(t))-\sum_{j=1}^{n}d_{ij}(t)\,x_j^{\beta_{ij}}(t-\tau_{ij}(t)) \\
&\qquad-\sum_{j=1}^{n}e_{ij}(t)\int_{-\eta_{ij}}^{0}k_{ij}(s)\,x_j^{\gamma_{ij}}(t+s)\,ds-\sum_{j=1}^{n}f_{ij}(t)\int_{-\theta_{ij}}^{0}K_{ij}(\xi)\,x_i^{\delta_{ij}}(t+\xi)\,x_j^{\sigma_{ij}}(t+\xi)\,d\xi\Big],\quad \text{a.e. } t>0,\ t\neq t_k, \\
x_i(t_k^{+})-x_i(t_k^{-}) &= h_{ik}\,x_i(t_k),\qquad i=1,2,\dots,n,\ k\in\mathbb{Z}^{+}.
\end{aligned}
$$

By applying the Krasnoselskii fixed-point theorem in a cone of a Banach space, we derive some verifiable necessary and sufficient conditions for the existence of positive periodic solutions of these systems. As applications, some special cases of the systems are examined, and some earlier results are extended and improved.
Training Energy-Based Models with Diffusion Contrastive Divergences
Energy-Based Models (EBMs) have been widely used for generative modeling.
Contrastive Divergence (CD), a prevailing training objective for EBMs, requires
sampling from the EBM with Markov chain Monte Carlo (MCMC) methods, which
leads to an irreconcilable trade-off between computational burden and the
validity of CD: running MCMC until convergence is computationally
intensive, while short-run MCMC introduces an extra, non-negligible
parameter gradient term that is difficult to handle. In this paper, we provide
a general interpretation of CD, viewing it as a special instance of our
proposed Diffusion Contrastive Divergence (DCD) family. By replacing the
Langevin dynamics used in CD with other EBM-parameter-free diffusion processes,
we propose a more efficient family of divergences. We show that the proposed
DCDs are both more computationally efficient than CD and free of the
troublesome extra gradient term. We conduct extensive experiments, including
synthetic data modeling and high-dimensional image denoising and generation,
to demonstrate the advantages of the proposed DCDs. On the synthetic data and
image denoising experiments, our proposed DCD outperforms CD by a large margin.
In image generation experiments, the proposed DCD is capable of training an
energy-based model on the CelebA dataset with sample quality comparable to
existing EBMs.
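For orientation, here is a minimal sketch of the standard short-run-Langevin contrastive divergence baseline that the paper starts from (this is not the paper's DCD implementation; the energy network, data, and hyperparameters are illustrative assumptions):

import torch
import torch.nn as nn

# E_theta(x): a small illustrative energy network for 2-D toy data
energy = nn.Sequential(nn.Linear(2, 128), nn.SiLU(),
                       nn.Linear(128, 128), nn.SiLU(),
                       nn.Linear(128, 1))
opt = torch.optim.Adam(energy.parameters(), lr=1e-4)

def langevin(x, n_steps=20, step_size=0.01):
    # short-run Langevin dynamics on the current energy (gradients w.r.t. x only)
    x = x.clone().detach().requires_grad_(True)
    for _ in range(n_steps):
        grad = torch.autograd.grad(energy(x).sum(), x)[0]
        x = (x - 0.5 * step_size * grad
             + step_size ** 0.5 * torch.randn_like(x)).detach().requires_grad_(True)
    return x.detach()

def cd_step(data):
    # one CD update: lower the energy of data, raise it on short-run MCMC samples;
    # the extra gradient term caused by the non-converged sampler is simply ignored
    # here, which is exactly the bias the DCD family is designed to remove
    neg = langevin(torch.randn_like(data))
    loss = energy(data).mean() - energy(neg).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()

# toy usage on a 2-D synthetic dataset
for _ in range(200):
    batch = torch.randn(256, 2) * 0.5 + torch.tensor([2.0, 0.0])
    cd_step(batch)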
Diff-Instruct: A Universal Approach for Transferring Knowledge From Pre-trained Diffusion Models
Due to the ease of training, ability to scale, and high sample quality,
diffusion models (DMs) have become the preferred option for generative
modeling, with numerous pre-trained models available for a wide variety of
datasets. Containing intricate information about data distributions,
pre-trained DMs are valuable assets for downstream applications. In this work,
we consider learning from pre-trained DMs and transferring their knowledge to
other generative models in a data-free fashion. Specifically, we propose a
general framework called Diff-Instruct to instruct the training of arbitrary
generative models as long as the generated samples are differentiable with
respect to the model parameters. Our proposed Diff-Instruct is built on a
rigorous mathematical foundation where the instruction process directly
corresponds to minimizing a novel divergence we call Integral Kullback-Leibler
(IKL) divergence. IKL is tailored for DMs by calculating the integral of the KL
divergence along a diffusion process, which we show to be more robust in
comparing distributions with misaligned supports. We also reveal non-trivial
connections of our method to existing works such as DreamFusion and generative
adversarial training. To demonstrate the effectiveness and universality of
Diff-Instruct, we consider two scenarios: distilling pre-trained diffusion
models and refining existing GAN models. The experiments on distilling
pre-trained diffusion models show that Diff-Instruct results in
state-of-the-art single-step diffusion-based models. The experiments on
refining GAN models show that Diff-Instruct can consistently improve the
pre-trained generators of GAN models across various settings.
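Purely as an orientation aid, the following heavily simplified PyTorch sketch shows what an alternating, data-free distillation loop in the spirit of the described approach could look like. The toy networks (ScoreNet, generator), the diffusion schedule, and all hyperparameters are assumptions for illustration, not the paper's implementation:

import torch
import torch.nn as nn

# All modules below are small illustrative stand-ins: in practice teacher_score
# would be a frozen pre-trained diffusion model and generator an expressive
# one-step generator. Nothing here is the paper's actual code.
class ScoreNet(nn.Module):
    def __init__(self, dim=2):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim + 1, 128), nn.SiLU(),
                                 nn.Linear(128, dim))
    def forward(self, x, t):
        return self.net(torch.cat([x, t], dim=-1))

generator = nn.Sequential(nn.Linear(16, 128), nn.SiLU(), nn.Linear(128, 2))
teacher_score = ScoreNet()   # stand-in for the frozen pre-trained DM score
aux_score = ScoreNet()       # tracks the score of the generator's diffused samples
g_opt = torch.optim.Adam(generator.parameters(), lr=1e-4)
s_opt = torch.optim.Adam(aux_score.parameters(), lr=1e-4)

def diffuse(x0, t):
    # simple VP-style forward diffusion: x_t = cos(t) * x_0 + sin(t) * noise
    noise = torch.randn_like(x0)
    return torch.cos(t) * x0 + torch.sin(t) * noise, noise, torch.sin(t)

for step in range(1000):
    z = torch.randn(64, 16)
    t = torch.rand(64, 1) * 1.4 + 0.05     # diffusion times, bounded away from 0

    # (1) fit aux_score to the generator's diffused samples (denoising score matching)
    with torch.no_grad():
        x0 = generator(z)
    xt, noise, sigma = diffuse(x0, t)
    s_loss = ((aux_score(xt, t) + noise / sigma) ** 2).mean()
    s_opt.zero_grad(); s_loss.backward(); s_opt.step()

    # (2) update the generator so that, along the diffusion, its score moves toward
    #     the teacher's: a Monte Carlo surrogate of an integral-KL-style gradient
    x0 = generator(z)
    xt, _, _ = diffuse(x0, t)
    with torch.no_grad():
        direction = aux_score(xt, t) - teacher_score(xt, t)
    g_loss = (direction * xt).sum(dim=-1).mean()   # gradient flows through xt into generator
    g_opt.zero_grad(); g_loss.backward(); g_opt.step()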